545 research outputs found

    Study of $B_c^- \to J/\psi\,\pi^-$, $\eta_c\,\pi^-$ Decays with QCD Factorization

    The $B_c \to J/\psi\,\pi$, $\eta_c\,\pi$ decays are studied within the QCD factorization approach. The branching ratios are calculated with the asymptotic distribution amplitude of the pion, and the effect of the charm quark mass is considered. We find that the mass effect on the branching ratios is small. Comment: 20 pages, 3 figures, 3 tables
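    For reference, the asymptotic pion light-cone distribution amplitude mentioned in the abstract has the standard form sketched below; the normalization convention and the branching-ratio relation are standard textbook expressions and may differ from the paper's exact conventions.

```latex
% Asymptotic pion light-cone distribution amplitude (standard form, unit-normalized):
\phi_\pi^{\mathrm{as}}(x) = 6\,x\,(1 - x), \qquad \int_0^1 \phi_\pi^{\mathrm{as}}(x)\,\mathrm{d}x = 1 .
% Branching ratio from the partial decay width and the B_c lifetime:
\mathcal{B}(B_c^- \to J/\psi\,\pi^-) = \tau_{B_c}\,\Gamma(B_c^- \to J/\psi\,\pi^-).
```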

    Curriculum Reform and Practice Exploration of "Foundation of Innovation and Entrepreneurship" for College Normal Majors

    According to statistics from the Ministry of Education, the number of ordinary college graduates in 2023 is expected to reach 11.58 million, an increase of 820,000 over 2022 (General Office of the State Council of China, 2015). This figure once again sets a record high, and the employment situation is becoming increasingly severe. All sectors of society hold ever higher expectations for the comprehensive abilities of college students, who face growing pressure and challenges in the job-hunting process. This paper discusses curriculum design, teaching modes, education and teaching reform strategies, experimental research, conclusions and prospects, so as to provide useful references for the education and teaching reform of innovation and entrepreneurship management courses for college students and to improve the quality of graduate training and their employment competitiveness.

    Loss Rank Mining: A General Hard Example Mining Method for Real-time Detectors

    Modern object detectors often suffer from low accuracy because foreground objects are overwhelmed by vast numbers of background regions, which become hard examples during training. Compared with proposal-based detectors, real-time detectors are affected far more severely, since they forgo the region-proposal stage that filters out most backgrounds in order to achieve real-time rates. Although foreground hard examples urgently need to be mined from these backgrounds, many state-of-the-art real-time detectors, such as the YOLO series, have yet to benefit from existing hard example mining methods, because those methods require detectors to satisfy a series of prerequisites. In this paper, we propose a general hard example mining method named Loss Rank Mining (LRM) to fill this gap. LRM is general for real-time detectors because it operates on the final feature map, which exists in all real-time detectors, to mine hard examples: elements of the final feature map that represent easy examples are filtered out, forcing the detector to concentrate on hard examples during training. Extensive experiments validate the effectiveness of our method. With LRM, the YOLOv2 detector improves by over 5% mAP on the autonomous-driving dataset KITTI and over 2% mAP on the more general dataset PASCAL VOC. In addition, LRM is the first hard example mining strategy that fits YOLOv2 well, making it better suited to real-world scenarios where both real-time rates and accurate detection are demanded. Comment: 8 pages, 6 figures
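    A minimal sketch of the filtering idea described in this abstract: rank the per-element losses over the final feature map and back-propagate only the top-ranked (hard) ones. The function name, the `keep_ratio` knob and the mean reduction are illustrative assumptions, not the paper's exact formulation.

```python
import torch

def loss_rank_mining(loss_map: torch.Tensor, keep_ratio: float = 0.3) -> torch.Tensor:
    """Keep only the highest-ranked per-element losses (hard examples).

    `loss_map` holds one loss value per element of the final feature map;
    `keep_ratio` (illustrative) controls how many elements survive the filter.
    """
    flat = loss_map.flatten()
    k = max(1, int(flat.numel() * keep_ratio))
    # Rank losses and keep the top-k: easy examples (low loss) are masked out,
    # so gradients concentrate on the hard examples.
    _, top_idx = torch.topk(flat, k)
    mask = torch.zeros_like(flat)
    mask[top_idx] = 1.0
    return (flat * mask).sum() / k
```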

    SDA: Simple Discrete Augmentation for Contrastive Sentence Representation Learning

    Contrastive learning methods achieve state-of-the-art results in unsupervised sentence representation learning. Although data augmentation plays an essential role in contrastive learning, augmentation methods applied to sentences have not been fully explored. The current SOTA method SimCSE uses a simple dropout mechanism as continuous augmentation, which outperforms discrete augmentations such as cropping, word deletion and synonym replacement. To understand the underlying rationale, we revisit existing approaches and hypothesize the desiderata of a reasonable data augmentation method: a balance of semantic consistency and expression diversity. Based on this hypothesis, we propose three simple yet effective discrete sentence augmentation methods: punctuation insertion, affirmative auxiliary and double negation. The punctuation marks, auxiliaries and negative words act as minimal lexical-level noise that produces diverse sentence expressions. Unlike traditional augmentation methods that modify the sentence randomly, our augmentation rules are designed to generate semantically consistent and grammatically correct sentences. We conduct extensive experiments on both English and Chinese semantic textual similarity datasets. The results show the robustness and effectiveness of the proposed methods.
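    A minimal sketch of what such rule-based discrete augmentations might look like in English. The abstract does not give the exact rules, so the templates below (comma insertion, an "it is true that" auxiliary, and an "it is not false that" double negation) are illustrative assumptions rather than the paper's implementation.

```python
import random

def punctuation_insertion(sentence: str, mark: str = ",") -> str:
    """Insert a punctuation mark between two random adjacent words."""
    words = sentence.split()
    if len(words) < 2:
        return sentence
    pos = random.randint(1, len(words) - 1)
    return " ".join(words[:pos]) + mark + " " + " ".join(words[pos:])

def affirmative_auxiliary(sentence: str) -> str:
    """Wrap the sentence in an affirmative auxiliary construction (illustrative template)."""
    if not sentence:
        return sentence
    return "It is true that " + sentence[0].lower() + sentence[1:]

def double_negation(sentence: str) -> str:
    """Wrap the sentence in a meaning-preserving double negation (illustrative template)."""
    if not sentence:
        return sentence
    return "It is not false that " + sentence[0].lower() + sentence[1:]
```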

    Parameter-free Dynamic Graph Embedding for Link Prediction

    Dynamic interaction graphs have been widely adopted to model the evolution of user-item interactions over time. Two factors are crucial when modelling user preferences for link prediction in dynamic interaction graphs: 1) collaborative relationships among users and 2) users' personalized interaction patterns. Existing methods often consider these two factors together implicitly, which may lead to noisy user modelling when the two factors diverge. In addition, they usually require time-consuming parameter learning with back-propagation, which is prohibitive for real-time user preference modelling. To this end, this paper proposes FreeGEM, a parameter-free dynamic graph embedding method for link prediction. Firstly, to exploit collaborative relationships, we propose an incremental graph embedding engine to obtain user/item embeddings. It follows an Online-Monitor-Offline architecture: an Online module approximately embeds users/items over time, a Monitor module estimates the approximation error in real time, and an Offline module recalibrates the user/item embeddings when the online approximation error exceeds a threshold. Meanwhile, we integrate attribute information into the model, which enables FreeGEM to better model users belonging to under-represented groups. Secondly, we design a personalized dynamic interaction pattern modeller, which combines dynamic time decay with an attention mechanism to model users' short-term interests. Experimental results on two link prediction tasks show that FreeGEM outperforms state-of-the-art methods in accuracy while achieving over a 36x improvement in efficiency. All code and datasets can be found at https://github.com/FudanCISL/FreeGEM. Comment: 19 pages, 9 figures, 13 tables, Thirty-Sixth Conference on Neural Information Processing Systems (NeurIPS 2022), preprint version
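    A minimal sketch of the second component described above, combining attention scores with an exponential time decay so that older interactions contribute less to the short-term interest vector. The decay form, the `decay` rate and the softmax normalization are illustrative assumptions, not FreeGEM's exact modeller.

```python
import numpy as np

def time_decay_attention(item_embs: np.ndarray, timestamps: np.ndarray,
                         query: np.ndarray, now: float, decay: float = 0.1) -> np.ndarray:
    """Aggregate a user's recent item embeddings into a short-term interest vector.

    item_embs:  (n, d) embeddings of the user's recently interacted items
    timestamps: (n,)   interaction times; `decay` (illustrative) damps older ones
    query:      (d,)   current user embedding used as the attention query
    """
    scores = item_embs @ query                    # attention logits
    scores = scores - decay * (now - timestamps)  # dynamic time decay on older interactions
    weights = np.exp(scores - scores.max())       # numerically stable softmax
    weights /= weights.sum()
    return weights @ item_embs                    # weighted sum = short-term interest vector
```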